Kernel Density Based Linear Regression Estimate
Abstract
For linear regression models with non-normally distributed errors, the least squares estimate (LSE) loses efficiency relative to the maximum likelihood estimate (MLE). In this article, we propose a kernel density based regression estimate (KDRE) that is adaptive to the unknown error distribution. The key idea is to approximate the likelihood function by a nonparametric kernel density estimate of the error density, built from the residuals of an initial parameter estimate. The proposed estimate is shown to be asymptotically as efficient as the oracle MLE, which assumes the error density is known. In addition, we propose an EM type algorithm to maximize the estimated likelihood function and show that the KDRE can be viewed as an iterated weighted least squares estimate, which provides insight into the adaptiveness of KDRE to the unknown error distribution. Our Monte Carlo simulation studies show that, while comparable to the traditional LSE for normal errors, the proposed estimation procedure can yield substantial efficiency gains for non-normal errors. Moreover, the efficiency gain can be achieved even for a small sample size.
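To make the two-step construction concrete, here is a minimal Python sketch, assuming a Gaussian kernel, a rule-of-thumb bandwidth, ordinary least squares as the initial estimate, and a fixed iteration count; the function name kdre_em is illustrative, and the paper's exact EM update may differ in detail.

```python
import numpy as np

def kdre_em(X, y, n_iter=50, h=None):
    """EM-style sketch of the kernel density based regression estimate (KDRE)."""
    n = len(y)
    # Step 1: initial estimate (ordinary least squares); its residuals serve
    # as the kernel centers of the estimated error density.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    eps0 = y - X @ beta
    if h is None:
        h = 1.06 * eps0.std() * n ** (-0.2)  # rule-of-thumb bandwidth (assumption)
    # Step 2: maximize the estimated likelihood by EM-type iteration.
    for _ in range(n_iter):
        e = y - X @ beta
        # E-step: responsibility of kernel center eps0_j for residual e_i.
        w = np.exp(-0.5 * ((e[:, None] - eps0[None, :]) / h) ** 2)
        w /= w.sum(axis=1, keepdims=True)
        # M-step: weighted least squares. With row-normalized weights, the
        # minimizer of sum_ij w_ij (y_i - x_i'beta - eps0_j)^2 is obtained by
        # regressing the shifted response y_i - sum_j w_ij eps0_j on x_i.
        beta, *_ = np.linalg.lstsq(X, y - w @ eps0, rcond=None)
    return beta
```

Each M-step is a least squares fit to a reweighted, shifted response, which is where the iterated weighted least squares interpretation, and the adaptiveness to the error shape, comes from.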
Similar Articles
Nonparametric Conditional Density Estimation Using Piecewise-Linear Solution Path of Kernel Quantile Regression
The goal of regression analysis is to describe the stochastic relationship between an input vector x and a scalar output y. This can be achieved by estimating the entire conditional density p(y|x). In this letter, we present a new approach for nonparametric conditional density estimation. We develop a piecewise-linear path-following method for kernel-based quantile regression. It enables us t...
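As a rough illustration of the quantile-to-density idea, one can difference a family of fitted conditional quantiles. The sketch below uses scikit-learn's plain linear QuantileRegressor as a stand-in for the letter's kernel-based path-following method, and the function name is illustrative.

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

def conditional_density_at(X, y, x0, taus=np.linspace(0.05, 0.95, 19)):
    """Finite-difference estimate of p(y|x0) from fitted conditional quantiles.

    x0 is a single 1-D feature vector at which the density is evaluated.
    """
    q = np.array([
        QuantileRegressor(quantile=t, alpha=0.0).fit(X, y).predict(x0[None, :])[0]
        for t in taus
    ])
    # Between adjacent quantile levels, density ~ d(tau) / d(q_tau).
    y_mid = 0.5 * (q[1:] + q[:-1])
    dens = np.diff(taus) / np.maximum(np.diff(q), 1e-12)
    return y_mid, dens
```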
Robust Estimation in Linear Regression Model: the Density Power Divergence Approach
The minimum density power divergence method provides a robust estimate when the dataset contains outliers. In this study, we introduce and use a robust minimum density power divergence estimator to estimate the parameters of the linear regression model, and then, with some numerical examples of the linear regression model, we show the robustness of this est...
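A minimal sketch of a minimum density power divergence fit for linear regression, assuming a normal working model for the errors and a user-chosen tuning constant alpha; the function name mdpde and the optimizer choice are illustrative, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize

def mdpde(X, y, alpha=0.5):
    """Minimum density power divergence estimate under a normal error model."""
    n, p = X.shape

    def dpd(theta):
        beta, log_s = theta[:p], theta[p]
        s = np.exp(log_s)                    # parameterize to enforce sigma > 0
        e = y - X @ beta
        f = np.exp(-0.5 * (e / s) ** 2) / (np.sqrt(2 * np.pi) * s)
        # Closed-form integral of f^(1+alpha) for the normal density.
        integral = (2 * np.pi * s ** 2) ** (-alpha / 2) / np.sqrt(1 + alpha)
        # Empirical DPD objective; downweights points with tiny model density.
        return integral - (1 + 1 / alpha) * np.mean(f ** alpha)

    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)
    s0 = np.log((y - X @ beta0).std())
    res = minimize(dpd, np.r_[beta0, s0], method="Nelder-Mead")
    return res.x[:p], np.exp(res.x[p])
```

Larger alpha buys more robustness to outliers at some cost in efficiency; alpha near zero approaches the MLE.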
Improved fast Gauss transform User manual
In most kernel based machine learning algorithms and non-parametric statistics, the key computational task is to compute a linear combination of local kernel functions centered on the training data, i.e., $f(x) = \sum_{i=1}^{N} q_i k(x, x_i)$, which is the discrete Gauss transform for the Gaussian kernel. f is the regression/classification function in case of regularized least squares, Gaussian process regre...
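For reference, the direct quadratic-cost evaluation that the improved fast Gauss transform is designed to approximate in linear time fits in a few lines; the bandwidth convention exp(-||x - x_i||^2 / h^2) follows the Gauss transform literature, and the function name is illustrative.

```python
import numpy as np

def direct_gauss_transform(x_eval, x_src, q, h):
    """Direct O(M*N) evaluation of f(x) = sum_i q_i exp(-||x - x_i||^2 / h^2).

    x_eval: (M, d) evaluation points; x_src: (N, d) source points; q: (N,) weights.
    """
    d2 = ((x_eval[:, None, :] - x_src[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / h ** 2) @ q
```

Note the direct form also needs O(MN) memory for the pairwise distance matrix, which is exactly what fast transforms avoid.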
A New Regression Model: Modal Linear Regression
The mode of a distribution provides an important summary of data and is often estimated based on some non-parametric kernel density estimator. This article develops a new data analysis tool called modal linear regression in order to explore high-dimensional data. Modal linear regression models the conditional mode of a response Y given a set of predictors x as a linear function of x. Modal linea...
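A small sketch of the mode-seeking idea, assuming a Gaussian kernel with a fixed bandwidth h and an OLS starting value. The iteration below is a standard EM/mean-shift-style update for maximizing the kernel objective over beta; names and defaults are illustrative.

```python
import numpy as np

def modal_linear_fit(X, y, h=1.0, n_iter=100):
    """Maximize sum_i K_h(y_i - x_i' beta) over beta, for a Gaussian kernel K_h."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)      # start from the OLS fit
    for _ in range(n_iter):
        w = np.exp(-0.5 * ((y - X @ beta) / h) ** 2)  # kernel weights on residuals
        sw = np.sqrt(w)
        # Weighted least squares step: observations near the current
        # conditional mode receive the largest weight.
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta
```

Because the weights shrink for points far from the current fit, the iteration gravitates toward the conditional mode rather than the conditional mean.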
A Sparse Kernel Density Estimation Algorithm Using Forward Constrained Regression
Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward constrained regression manner. The leave-one-out (LOO) test score is used for kernel selection. The jackknife parameter estimator, subject to a positivity constraint check, is used to estimate a single parameter at each forward step. As such the...
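A loose sketch of the forward-selection idea, assuming univariate data, Gaussian kernels with a common bandwidth, and scipy's non-negative least squares in place of the paper's jackknife and LOO machinery; all names below are illustrative.

```python
import numpy as np
from scipy.optimize import nnls

def gauss_design(grid, centers, h):
    """Gaussian kernel matrix: rows index grid points, columns index centers."""
    z = (grid[:, None] - centers[None, :]) / h
    return np.exp(-0.5 * z ** 2) / (np.sqrt(2 * np.pi) * h)

def sparse_kde(x, grid, h, n_kernels=10):
    """Greedily select kernel centers so that a sparse, non-negative mixture
    matches the full Parzen window estimate on an evaluation grid."""
    target = gauss_design(grid, x, h).mean(axis=1)    # classical PW estimate
    chosen = []
    for _ in range(n_kernels):
        errs = {}
        for j in set(range(len(x))) - set(chosen):
            A = gauss_design(grid, x[chosen + [j]], h)
            _, errs[j] = nnls(A, target)              # positivity-constrained fit
        chosen.append(min(errs, key=errs.get))        # center with smallest error
    w, _ = nnls(gauss_design(grid, x[chosen], h), target)
    return x[chosen], w / w.sum()                     # weights renormalized to sum to 1
```

This brute-force greedy pass is only a reference sketch; the paper's forward constrained regression achieves the same kind of sparse, positively weighted mixture far more efficiently.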